Defect Prediction Using Akaike and Bayesian Information Criterion

Authors

Abstract

Data available in software engineering for many applications contains variability, and it is not possible to say which variable helps the process of prediction. Most work on defect prediction has focused on selecting the best prediction techniques; for this purpose, deep learning and ensemble models have shown promising results. In contrast, very few studies deal with cleaning the training data and selecting the best parameter values from the data. Sometimes a high number of input variables may cause a decrease in model accuracy. To deal with this problem, we used the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) to select the variables used to train the model. A simple ANN with one input layer, one output layer, and two hidden layers was used instead of a complex model. AIC and BIC were calculated for each variable combination, and the combination with the minimum value was selected. At first, the variables were narrowed down to a smaller number using correlation values. Then subsets of all combinations of the remaining variables were formed. In the end, an artificial neural network (ANN) was trained on each subset, and the best subset was chosen on the basis of the smallest criterion value. It was found that the subset containing only the variables ns and entropy gives the minimum (best) value, while nm and npt give the worst (maximum) value.
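The subset-selection loop described in the abstract can be sketched as follows, using the standard Gaussian forms AIC = n·ln(RSS/n) + 2k and BIC = n·ln(RSS/n) + k·ln(n). This is a minimal sketch, not the authors' implementation: a least-squares model stands in for the paper's ANN, the data are synthetic, and only the metric names (ns, entropy, nm, npt) are taken from the abstract.

```python
import itertools
import numpy as np

def aic_bic(y, y_hat, k):
    """Gaussian-likelihood AIC and BIC computed from the residual sum of squares."""
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    ll_term = n * np.log(rss / n)  # -2 * log-likelihood, up to an additive constant
    return ll_term + 2 * k, ll_term + k * np.log(n)

def best_subset(X, y, names):
    """Fit a model on every non-empty feature subset; return (aic, bic, subset)
    for the subset with the smallest AIC."""
    results = []
    for r in range(1, len(names) + 1):
        for idx in itertools.combinations(range(len(names)), r):
            A = np.column_stack([np.ones(len(y)), X[:, idx]])  # intercept + features
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            y_hat = A @ coef
            aic, bic = aic_bic(y, y_hat, k=A.shape[1])
            results.append((aic, bic, tuple(names[i] for i in idx)))
    return min(results)  # smallest AIC wins

# Toy data in which the target depends only on the first two metrics.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)
aic, bic, subset = best_subset(X, y, ["ns", "entropy", "nm", "npt"])
print(subset)  # the informative metrics are recovered
```

Because BIC's penalty, k·ln(n), exceeds AIC's 2k once n > e², BIC prefers smaller subsets on larger samples; the paper reports both criteria and picks the minimum.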


Similar Articles

Exponential Smoothing and the Akaike Information Criterion

Using an innovations state space approach, it has been found that the Akaike information criterion (AIC) works slightly better, on average, than prediction validation on withheld data, for choosing between the various common methods of exponential smoothing for forecasting. There is, however, a puzzle. Should the count of the seed states be incorporated into the penalty term in the AIC formula?...


An improved Akaike information criterion for state-space model selection

Following the work of Hurvich, Shumway, and Tsai (1990), we propose an “improved” variant of the Akaike information criterion, AICi, for state-space model selection. The variant is based on Akaike’s (1973) objective of estimating the Kullback-Leibler information (Kullback 1968) between the densities corresponding to the fitted model and the generating or true model. The development of AICi proc...


An Akaike information criterion for multiple event mixture cure models

We derive the proper form of the Akaike information criterion for variable selection for mixture cure models, which are often fit via the expectation-maximization algorithm. Separate covariate sets may be used in the mixture components. The selection criteria are applicable to survival models for right-censored data with multiple competing risks and allow for the presence of an insusceptible gr...


Extending the Akaike Information Criterion to Mixture Regression Models

We examine the problem of jointly selecting the number of components and variables in finite mixture regression models. We find that the Akaike information criterion is unsatisfactory for this purpose because it overestimates the number of components, which in turn results in incorrect variables being retained in the model. Therefore, we derive a new information criterion, the mixture regressio...


Mixture structure analysis using the Akaike Information Criterion and the bootstrap

JEFFREY L. SOLKA*, EDWARD J. WEGMAN, CAREY E. PRIEBE, WENDY L. POSTON and GEORGE W. ROGERS Dahlgren Division of the Naval Surface Warfare Center, Systems Research and Technology Department, Advanced Computation Technology Group, Code B10, Dahlgren VA 22448-5100, USA Center for Computational Statistics, George Mason University, Fairfax, VA 22030-4444, USA Department of Mathematical Sciences, The...



Journal

Journal title: Computer Systems Science and Engineering

Year: 2022

ISSN: 0267-6192

DOI: https://doi.org/10.32604/csse.2022.021750